
# Hybrid SSM-Transformer architecture

## AI21 Jamba Large 1.5
License: Other
AI21 Jamba 1.5 is a series of advanced foundation models with powerful long-context processing capabilities and efficient inference speed, suitable for a range of business scenarios.
Tags: Large Language Model · Safetensors · Supports Multiple Languages
ai21labs · 2,642 · 216

## AI21 Jamba Mini 1.5
License: Other
AI21 Jamba 1.5 Mini is an advanced hybrid SSM-Transformer instruction-following foundation model with efficient long-context processing capabilities and fast inference speed.
Tags: Large Language Model · Transformers · Supports Multiple Languages
ai21labs · 6,102 · 269

## Jamba V0.1
License: Apache-2.0
Jamba is a state-of-the-art hybrid SSM-Transformer large language model that combines the strengths of the Mamba architecture with Transformer attention. It supports a 256K context length and surpasses similarly sized models in throughput and performance.
Tags: Large Language Model · Transformers
ai21labs · 6,247 · 1,181
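The hybrid design described above interleaves a small number of Transformer attention layers among Mamba (SSM) layers. As a rough illustration, the Jamba paper reports roughly one attention layer per block of eight layers; the sketch below is a hypothetical layer schedule under that 1:7 ratio, with the in-block position of the attention layer chosen arbitrarily for illustration rather than taken from the released model.

```python
def layer_schedule(num_layers: int, attn_every: int = 8, attn_offset: int = 4):
    """Return a list of layer types ('attention' or 'mamba') for a
    hypothetical hybrid SSM-Transformer stack.

    attn_every / attn_offset are illustrative assumptions: one attention
    layer per block of `attn_every` layers, placed at `attn_offset`
    within each block.
    """
    return [
        "attention" if i % attn_every == attn_offset else "mamba"
        for i in range(num_layers)
    ]

# A 32-layer stack under the assumed 1:7 attention-to-Mamba ratio.
schedule = layer_schedule(32)
print(schedule.count("attention"))  # → 4 attention layers out of 32
```

Keeping attention layers sparse is what lets such models trade most of attention's quadratic cost for the SSM layers' linear-time scan, which is how long contexts like 256K tokens stay tractable.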